Embedding Entities and Relations for Learning and Inference in Knowledge Bases
Authors
Bishan Yang, Wen-tau Yih, Xiaodong He, Jianfeng Gao, Li Deng
Abstract
We consider learning representations of entities and relations in KBs using the neural-embedding approach. We show that most existing models, including NTN (Socher et al., 2013) and TransE (Bordes et al., 2013b), can be generalized under a unified learning framework, where entities are low-dimensional vectors learned from a neural network and relations are bilinear and/or linear mapping functions. Under this framework, we compare a variety of embedding models on the link prediction task. We show that a simple bilinear formulation achieves new state-of-the-art results for the task (achieving a top-10 accuracy of 73.2% vs. 54.7% by TransE on Freebase). Furthermore, we introduce a novel approach that utilizes the learned relation embeddings to mine logical rules such as BornInCity(a, b) ∧ CityInCountry(b, c) ⟹ Nationality(a, c). We find that embeddings learned from the bilinear objective are particularly good at capturing relational semantics, and that the composition of relations is characterized by matrix multiplication. More interestingly, we demonstrate that our embedding-based rule extraction approach successfully outperforms a state-of-the-art confidence-based rule mining approach in mining Horn rules that involve compositional reasoning.
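To make the bilinear formulation and the composition idea concrete, the following is a minimal sketch (not the paper's exact procedure): a triple (e1, r, e2) is scored as y_e1^T M_r y_e2, and a candidate two-step rule body r1 ∧ r2 ⟹ r3 is ranked by how closely the product of the body's relation matrices matches the head relation's matrix. The entity and relation names, the random toy embeddings, and the Frobenius-distance ranking are illustrative assumptions; in the paper the embeddings are learned from knowledge-base triples.

import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy embedding dimension (assumption; real models use much larger vectors)

# Toy entity vectors and relation matrices, randomly initialized here for illustration.
entities = {name: rng.standard_normal(dim) for name in ["alice", "paris", "france"]}
relations = {name: rng.standard_normal((dim, dim))
             for name in ["BornInCity", "CityInCountry", "Nationality"]}

def bilinear_score(e1, rel, e2):
    # Plausibility of the triple (e1, rel, e2): y_e1^T M_rel y_e2.
    return entities[e1] @ relations[rel] @ entities[e2]

def rule_distance(body, head):
    # Compositional Horn rule body r1(a,b) ∧ r2(b,c) ⟹ head(a,c):
    # compare the matrix product of the body relations to the head relation's matrix.
    r1, r2 = body
    return np.linalg.norm(relations[r1] @ relations[r2] - relations[head], ord="fro")

print(bilinear_score("alice", "Nationality", "france"))
print(rule_distance(("BornInCity", "CityInCountry"), "Nationality"))

With learned rather than random embeddings, rule bodies whose matrix product lies closest to the head relation's matrix would be the candidate Horn rules mined by this kind of procedure.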
Similar resources
Jointly Embedding Relations and Mentions for Knowledge Population
This paper contributes a joint embedding model for predicting relations between a pair of entities in the scenario of relation inference. It differs from most standalone approaches which separately operate on either knowledge bases or free texts. The proposed model simultaneously learns low-dimensional vector representations for both triplets in knowledge repositories and the mentions of relati...
Analogical Inference for Multi-relational Embeddings
Large-scale multi-relational embedding refers to the task of learning the latent representations for entities and relations in large knowledge graphs. An effective and scalable solution for this problem is crucial for the true success of knowledge-based inference in a broad range of applications. This paper proposes a novel framework for optimizing the latent representations with respect to the ...
EmbedViz - Graph Visualization of Learned Structured Embeddings of Knowledge Bases
In order to gather, organize, and make deliberate use of the massive amounts of information generated daily, special kinds of web-based relational databases specifically designed for knowledge management, collection, and retrieval, called Knowledge Bases (KBs), have been built. The problem is, the data in these KBs is often locked up in a specific format making it difficult for them to be accessed...
Structured Embedding via Pairwise Relations and Long-Range Interactions in Knowledge Base
We consider the problem of embedding entities and relations of knowledge bases into low-dimensional continuous vector spaces (distributed representations). Unlike most existing approaches, which are primarily efficient for modelling pairwise relations between entities, we attempt to explicitly model both pairwise relations and long-range interactions between entities, by interpreting them as li...
Journal: CoRR
Volume: abs/1412.6575
Issue: -
Pages: -
Publication year: 2014